All Grants | Cycle 3
Ethel-Ruth Tawe is an anti-disciplinary artist and creative researcher whose work explores memory in Africa and its diaspora. She uses photography, collage, text, moving image, installation, and other time-based media to examine culture and technology, often through a speculative lens. Her curatorial practice began with the exhibition African Ancient Futures and now encompasses diverse audiovisual experiments. Tawe is Editor-in-Chief at Contemporary And (C&) Magazine.
Tawe has consulted on and produced global campaigns for international cultural platforms such as ART X Lagos and the Global Alliance for Green and Gender Action. She has conducted brand talent research for Nike and Johnnie Walker, as well as film/image research for selections shown at Tribeca Film Festival, BlackStar Film Festival, and the Barbican Centre, among others. She is a contributor to the British Journal of Photography and Quartz Feminists, among other publications. She was previously Editorial Director for the arts and culture platform Africa 2.0 Magazine. Tawe holds an MSc in Development Studies from SOAS University of London and a BA (Hons.) in International Human Rights.
Links
Profile Links
Ethel-Ruth Tawe’s Website
Research / Art Practice Links
“Technology on Loop: An alternative history of algorithms through Double Exposures” (Trigger, FoMu, 16 November 2023)
“Image Frequency Modulation” (Black Discourse)
“Cosmic (Re)memory, Diaspora, and Other African Technologies” (Feminist, 14 December 2022)
Other Links
10×10 Research Grant Presentation Video (Ethel-Ruth Tawe, 00:49:35)
Summary of Research Supported by 10×10 Photobooks Grant:
The Algorithms of Colonial African Photobooks
Earlier this year, I presented an artistic response to Africa in The Photobook, a collection by photohistorian Ben Krewinkel that traces the visual representation of Africa through the medium of the photobook. Titled Double Exposures, the exhibition investigated the tensions between the folds of pages, the afterlives of images, their captions, and their contexts. From the early use of propaganda in a troubling colonial archive to postcolonial technologies of capture, we asked careful questions about the lens through which photobooks are read today. Framing and bending its chronology, the exhibition included “footnotes” and an annotated bibliography arranged within a time-spiral. I began to understand early African photobooks as functioning algorithmically: a set of rules for surveillance in the colonial machine, muting the interior lives of those depicted. But what are the afterlives of these algorithms today, in the age of artificial intelligence and machine learning?

During my research, two artworks emerged in response to ethnographic photobooks, a genre that attempted to capture and categorize African “subjects.” Opacity (2023) is a counter-study of confrontational practices of visibility in colonial photobooks, using 48 replicated image plates, in their original sequence, from a portfolio book on the people of the Malange and Lunda regions of Angola. The work refutes and redacts common elements of the compelled photography and propaganda of the time. The exposed plates with turned backs are to be read as unintended signals of refusal and misidentification. Guests were invited into haptic encounters, lifting the transparent sheets with which the plates were now veiled to reveal the photographs beneath, but only after posing new, pertinent questions. A single-channel video projected onto a book spread spoke to the same harms of early identification photography. “Typing…” is a quiet film on embodied gestures of refusal. A moving portrait of a Mangbetu woman in 1970s Belgian Congo reveals signals of negation in the discomforting positions African sitters were often subjected to. It sits in contrast with the word “typing …,” written text whose ellipsis alludes both to the wait for a message that never arrives and to the pseudoscientific attempt to dehumanize and catalog African people into rigid ‘types’.
Simply put, an algorithm is a set of rules or instructions; in machine learning, algorithms are trained on datasets (training data) to build artificial intelligence. These systems are the foundations of the rapidly morphing digital space we are increasingly being programmed into. While highly sophisticated and seemingly ‘self-intelligent’, algorithms show a strong formulaic correlation with the functions and anatomy of colonial photobooks. The algorithmic bias and coded gaze of modern technologies can especially be paralleled with the programming techniques of ethnographic photobooks as instruments of empire. Their production and circulation fluctuated to serve the regulatory needs of the state and the classificatory imperatives of colonization. The sequencing, scrolling, and haptics of the photobook have been projected into digital space as user experience, capitalizing on and programming the minds of the masses. Photobooks can be understood as early social media: seemingly open and free, but heavily invested in mining personal data.

According to Dr. Safiya Noble, author of ‘Algorithms of Oppression’, Google’s first digitization project involved the mass scanning of books at libraries to build what she believes were the earliest datasets for machine learning and search engines. In her work, Dr. Noble interrogates examples of her searches for ‘Black girls’ returning primarily pornographic material, an extension of colonial African photobooks’ notorious fixation on the Black female body from a predominantly white male gaze. African women were disproportionately subjected to this gaze, hence their conscription as sitters for staged, exoticized, and eroticized images sent across the Atlantic to flaunt human conquest. Today, algorithms are still being built from a monolithic Western lens, while crowd-sourced data reflects the views of those with greater access to the internet and the ability to assert dominance in cyberspace as they do in real life. These practices are what I call ‘software/softwar’ for their ability to weaponize Black bodies, though they have never gone without subversion.
Through my engagements with Mozilla, an organization dedicated to building trustworthy AI and a more ethical internet, I have gained a stronger interest in counter-theories such as ‘ancestral intelligence’, focused on reimagining the logic of these technologies and on ways to respect our ‘data bodies’. How can we train AI algorithms with anti-colonial imagery and datasets? There is an evident gap in knowledge on the correlation between colonial photobooks as early datasets and their impact on algorithmic bias. With this research, I plan to dive into private and public collections, online libraries, and other unlikely repositories to creatively trace new pathways to counteract this trajectory. From facial recognition technology using image analysis and biometrics to the synthesized imagery of generative AI platforms like DALL-E and Midjourney, photography is deeply embedded in the evolution of AI and its large language models. Metadata and captions can reinforce stereo(types) encoded into the fabric of our everyday lives. As we look into our “black mirrors” (phone screens), we must consider the social implications, serious concerns, and dangers of AI, as well as its earliest “memories,” including data from colonial photobooks.
